# Efficient Transformer
## HiT-SRF 4x DF2K
Author: XiangZ · Tags: Image Enhancement · Downloads: 27 · Likes: 0

HiT-SR is a general strategy for enhancing transformer-based super-resolution methods, improving existing model structures to achieve better super-resolution performance while reducing computational burden.

## SONICS SpecTTTra Gamma 5s
Author: awsaf49 · License: MIT · Tags: Audio Classification, English · Downloads: 119 · Likes: 0

An advanced model for end-to-end detection of AI-generated songs, particularly strong at capturing long-range audio features.

## GLiNER-BioMed Bi Large v1.0
Author: Ihor · License: Apache-2.0 · Tags: Sequence Labeling, English · Downloads: 56 · Likes: 1

GLiNER-BioMed is an efficient open NER model suite based on the GLiNER framework, specifically designed for the biomedical domain to recognize various types of biomedical entities.

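GLiNER models take a free-form label set at inference time, so zero-shot biomedical NER needs only a few lines. A minimal sketch, assuming the `gliner` Python package is installed and the repo id follows this listing (`Ihor/gliner-biomed-bi-large-v1.0`):

```python
from gliner import GLiNER

# Load the bi-encoder large variant (repo id assumed from this listing).
model = GLiNER.from_pretrained("Ihor/gliner-biomed-bi-large-v1.0")

text = "Mutations in the BRCA1 gene increase the risk of breast cancer and may respond to olaparib."
# Labels are free text and can be changed without retraining.
labels = ["gene", "disease", "drug"]

for ent in model.predict_entities(text, labels, threshold=0.5):
    print(f'{ent["text"]} -> {ent["label"]} ({ent["score"]:.2f})')
```
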
## GLiNER-BioMed Bi Base v1.0
Author: Ihor · License: Apache-2.0 · Tags: Sequence Labeling, English · Downloads: 25 · Likes: 1

GLiNER-BioMed is an efficient open biomedical named entity recognition model suite based on the GLiNER framework, capable of recognizing multiple entity types.

## GLiNER-BioMed Large v1.0
Author: Ihor · License: Apache-2.0 · Tags: Sequence Labeling, English · Downloads: 163 · Likes: 6

GLiNER-BioMed is a specialized and efficient open biomedical NER model suite based on the GLiNER framework, achieving state-of-the-art zero-shot and few-shot performance in biomedical entity recognition tasks.

## Janus-Pro 1B
Author: deepseek-ai · License: MIT · Tags: Text-to-Image, Transformers · Downloads: 34.02k · Likes: 432

Janus-Pro is a novel autoregressive framework that unifies multimodal understanding and generation. It decouples the visual encoding paths for the two tasks while handling both with a single Transformer backbone.

## DeepSeek LLM 7B Base AWQ
Author: TheBloke · License: Other · Tags: Large Language Model, Transformers · Downloads: 1,863 · Likes: 2

DeepSeek LLM 7B Base is a 7B-parameter foundation language model; this release packages it with AWQ quantization for more efficient inference.

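Recent versions of the Transformers library can load AWQ-quantized checkpoints directly when the `autoawq` backend is installed. A minimal generation sketch, assuming a CUDA device and the repo id shown in this listing:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/deepseek-llm-7B-base-AWQ"  # repo id assumed from this listing
tokenizer = AutoTokenizer.from_pretrained(model_id)
# The 4-bit AWQ weights are handled by the autoawq kernels at load time.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Efficient transformers reduce inference cost by", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
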
## CED Tiny
Author: mispeech · License: Apache-2.0 · Tags: Audio Classification, Transformers · Downloads: 54 · Likes: 2

CED-Tiny is a simple ViT-based audio tagging model, achieving state-of-the-art performance on AudioSet.

## CED Small
Author: mispeech · License: Apache-2.0 · Tags: Audio Classification, Transformers · Downloads: 18 · Likes: 0

CED is a simple ViT-based audio tagging model, achieving state-of-the-art performance on AudioSet.

## LongT5 TGlobal XL
Author: google · License: Apache-2.0 · Tags: Large Language Model, Transformers, English · Downloads: 336 · Likes: 23

LongT5 is a Transformer-based text-to-text model specifically designed to handle long sequence inputs, supporting up to 16,384 tokens.

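Because this checkpoint is pretrained with a denoising objective rather than fine-tuned for a downstream task, it is usually fine-tuned before use; the sketch below only shows how the long input path is exercised, assuming the standard Transformers seq2seq classes:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "google/long-t5-tglobal-xl"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# LongT5 with transient-global attention accepts inputs far beyond T5's 512 tokens.
long_document = "A very long report paragraph. " * 2000
inputs = tokenizer(long_document, max_length=16384, truncation=True, return_tensors="pt")

output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
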
## XLM-RoBERTa Longformer Base 4096
Author: markussagen · License: Apache-2.0 · Tags: Large Language Model, Transformers, Other · Downloads: 9,499 · Likes: 37

A long-sequence model that extends XLM-RoBERTa with Longformer-style attention, supporting inputs of up to 4096 tokens and suitable for multilingual tasks.

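A minimal encoding sketch, assuming the checkpoint loads through the standard Auto classes; the important detail is tokenizing up to 4096 tokens instead of the usual 512:

```python
from transformers import AutoTokenizer, AutoModel

model_id = "markussagen/xlm-roberta-longformer-base-4096"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Any XLM-R language works; only the sequence-length budget has changed.
text = "Ceci est un très long document multilingue. " * 300
inputs = tokenizer(text, max_length=4096, truncation=True, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```
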
## ELECTRA Large Generator
Author: google · License: Apache-2.0 · Tags: Large Language Model, English · Downloads: 473 · Likes: 8

ELECTRA is an efficient self-supervised language representation learning method that replaces traditional generative pretraining with discriminative pretraining, significantly improving computational efficiency. This checkpoint is the large generator component, a masked language model used during ELECTRA pretraining.

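Since the generator checkpoint is itself a masked language model, the quickest way to probe it is the fill-mask pipeline; a minimal sketch assuming the standard Transformers pipeline API:

```python
from transformers import pipeline

# The generator was trained to propose plausible replacements for masked tokens.
fill = pipeline("fill-mask", model="google/electra-large-generator")

for pred in fill("The Transformer architecture relies heavily on [MASK] mechanisms."):
    print(f'{pred["token_str"]:>12}  {pred["score"]:.3f}')
```
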
## FNet Base Finetuned QQP
Author: gchhablani · License: Apache-2.0 · Tags: Text Classification, Transformers, English · Downloads: 14 · Likes: 0

This model is a fine-tuned version of google/fnet-base on the GLUE QQP dataset for text classification tasks, specifically targeting the problem of detecting duplicate Quora question pairs.

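QQP is a sentence-pair task, so the two questions are passed to the tokenizer together; a minimal sketch assuming PyTorch and the standard sequence-classification head (label names depend on the checkpoint's config):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "gchhablani/fnet-base-finetuned-qqp"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

q1 = "How can I learn Python quickly?"
q2 = "What is the fastest way to learn Python?"
inputs = tokenizer(q1, q2, return_tensors="pt")  # encodes the pair with a separator token

with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])  # duplicate vs. not duplicate (exact names per config)
```
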
## FNet Base Finetuned SST-2
Author: gchhablani · License: Apache-2.0 · Tags: Text Classification, English · Downloads: 16 · Likes: 1

A text classification model based on Google's FNet architecture, fine-tuned on the SST-2 sentiment analysis dataset.

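For single-sentence sentiment classification, the text-classification pipeline is enough; a minimal sketch assuming the repo id shown in this listing:

```python
from transformers import pipeline

classify = pipeline("text-classification", model="gchhablani/fnet-base-finetuned-sst2")
print(classify("A remarkably tender and well-acted film."))
# -> [{'label': ..., 'score': ...}] with the positive/negative label from the config
```
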
## BigBird RoBERTa Large
Author: google · License: Apache-2.0 · Tags: Large Language Model, English · Downloads: 1,152 · Likes: 27

BigBird is a Transformer model based on sparse attention, capable of processing sequences up to 4096 tokens long, suitable for long document tasks.

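A minimal encoding sketch, assuming the Transformers BigBird classes; the `attention_type` argument selects block-sparse attention, which is what makes 4096-token inputs tractable:

```python
from transformers import AutoTokenizer, BigBirdModel

model_id = "google/bigbird-roberta-large"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# "block_sparse" (the default) scales roughly linearly with length;
# "original_full" falls back to quadratic attention for short inputs.
model = BigBirdModel.from_pretrained(model_id, attention_type="block_sparse")

long_text = "Long documents need sparse attention. " * 500
inputs = tokenizer(long_text, max_length=4096, truncation=True, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```
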
## DistilBERT Base En-Es-Pt Cased
Author: Geotrend · License: Apache-2.0 · Tags: Large Language Model, Transformers, Other · Downloads: 17 · Likes: 0

This is a compact version of distilbert-base-multilingual-cased, supporting English, Spanish, and Portuguese, while maintaining the original model's accuracy.

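The reduced-vocabulary checkpoint loads like any other DistilBERT model; a minimal sketch, assuming the repo id follows the Geotrend naming used in this listing:

```python
from transformers import AutoTokenizer, AutoModel

model_id = "Geotrend/distilbert-base-en-es-pt-cased"  # repo id assumed from this listing
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# The trimmed vocabulary covers only English, Spanish and Portuguese,
# which is what keeps the embedding matrix (and the model) small.
inputs = tokenizer("Los modelos compactos reducen el costo de inferencia.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)
```
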
## XLM-RoBERTa Longformer Base 4096
Author: Peltarion · License: Apache-2.0 · Tags: Large Language Model, Transformers, Other · Downloads: 64 · Likes: 8

An extended XLM-RoBERTa model supporting sequences up to 4096 tokens, suitable for multilingual tasks.

## ALBERT XXLarge v1
Author: albert · License: Apache-2.0 · Tags: Large Language Model, Transformers, English · Downloads: 930 · Likes: 5

ALBERT XXLarge v1 is a Transformer model pretrained on English text with a masked language modeling (MLM) objective; it shares parameters across layers to keep the parameter count low.

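Since the checkpoint is a raw MLM-pretrained model, fill-mask is the most direct way to exercise it; a minimal sketch assuming the `albert/albert-xxlarge-v1` repo id:

```python
from transformers import pipeline

# Cross-layer parameter sharing keeps ALBERT's parameter count low relative
# to its hidden size; usage is otherwise identical to BERT-style models.
fill = pipeline("fill-mask", model="albert/albert-xxlarge-v1")

for pred in fill("Parameter sharing makes the model more [MASK]."):
    print(f'{pred["token_str"]:>12}  {pred["score"]:.3f}')
```
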